Digital Creativity: a Practical Guide

AI generation

A practical guide to getting digitally creative and using digital tools and technologies to explore work, ideas, and research.

Using AI generation tools

Artificial intelligence, or AI for short, has brought a range of opportunities (and some issues). We'll explore the world of AI generation tools and some of the considerations you need to bear in mind when using these tools.

What are AI generation tools?

Artificial intelligence, or AI for short, involves using computers to do tasks which mimic human intelligence in some way. There is a vast range of methods for creating artificial intelligence, and many applications for AI.

One area in which AI has been used is for generating content that is similar to other content, sometimes known as generative artificial intelligence or generative AI. This is often done using a branch of artificial intelligence known as machine learning, which focuses on getting computers to appear to learn how to do something, like recognise or generate certain kinds of content. Machine learning can be supervised, where the computer learns from example data that has been labelled with the 'right answers', or unsupervised, where it has to find patterns or groupings in data without being given labelled examples.
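
If you're comfortable with a little code, the difference can be seen in a minimal Python sketch using the scikit-learn library (the tiny 'fruit' dataset below is invented purely for illustration):

    # A minimal sketch of the supervised/unsupervised distinction, using scikit-learn.
    # The tiny dataset is made up purely for illustration.
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.cluster import KMeans

    # Supervised: each example comes with a label (the 'right answer') to learn from.
    examples = [[150, 1], [170, 0], [140, 1], [180, 0]]  # two made-up numeric features per fruit
    labels = ["apple", "orange", "apple", "orange"]
    classifier = DecisionTreeClassifier().fit(examples, labels)
    print(classifier.predict([[160, 1]]))                # predicts a label for unseen data

    # Unsupervised: no labels are given; the algorithm looks for groupings by itself.
    unlabelled = [[150, 1], [170, 0], [140, 1], [180, 0]]
    print(KMeans(n_clusters=2, n_init=10).fit_predict(unlabelled))  # cluster assigned to each example

In the first case the computer learns from labelled examples; in the second it groups the data without being told what the groups mean.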

AI generation tools are applications, usually web-based, which allow you to harness the power of this AI generation without needing to know how to create AI or machine learning apps yourself. These tools can do a range of things (you've probably seen some of them in action, especially text generation tools like ChatGPT or Microsoft Copilot), but also come with many caveats and restrictions. Typically, you give these tools some kind of prompt, maybe using text and/or images, and they 'generate' content in return.

Asking Copilot to list 10 examples of ways you could teach generative AI
An example of giving Microsoft Copilot a prompt.

One important thing to be aware of with AI generation tools is that they are all based on datasets. The data that the AI tool has been 'trained' on impacts what results you will get and what it can do. For example, generating images requires a large dataset of existing images, and the AI will create images based on what it learns from them, so certain styles or types of image in the dataset will mean you get particular results. The data that these tools are trained on might be copyrighted or someone's intellectual property (IP), which introduces other issues about what people do with the outputs of generative AI tools.

As we'll explore further on this page, think about what you use generative AI tools for. There is University guidance for students and postgraduate researchers around the use of generative AI in assessment and other work.

As we'll explore further on this page, it is also important to consider what information you are putting into a generative AI tool, whether as part of your prompt or in additional files you're providing for it to work with. The University's guidance on using Information Classification can help you determine how your information is classified, and includes advice on AI usage for different classifications of data.

There are also things to consider if you are looking to use generative AI outside of a University context. Careers and Placements have put together some guidance for students around using AI as an applicant that explores attitudes towards using AI as part of applying for jobs.

What can generative AI tools help you with?

Tools powered by artificial intelligence are everywhere, and they are not just generative AI - for example, everyday applications like Gmail use a range of AI techniques to do things like highlight important emails or suggest how you might want to reply to an email. There's a whole range of tasks that AI has been used to help with, so often you'll have to explore the tools out there to see what might be possible.

Generative AI tools are being used in a huge range of ways. Text generation, for example, is being used to search for information, to create outlines and plans, to proofread text, to explore ideas, and many other things. People are constantly finding new ways to use (and abuse) these tools, so we couldn't ever write an exhaustive list of how you can use any type of generative AI! Always check any University, departmental, funding, and/or publisher/journal guidelines around what uses of generative AI are acceptable for work that you might be submitting.

Tip

Be aware of the limitations of these tools. Always test what you want to do with a tool, and the outputs you get, before investing too much time in it. Many of these tools are changing rapidly as the technology develops, so keep an eye out for functionality changing. Also be aware of pricing models: many free tools have limitations like watermarks or caps on how much you can create, as they want you to pay for the full version. Sometimes tools go from being free to being paid-for as they become popular, so make sure you download or export any creations once you've made them.

Developing ideas

One area that both text and image generation AI tools can be used for is developing ideas. You don't use the outputs of generative AI as any kind of final work, but instead as useful prompts for you to consider as you develop ideas for things like creative projects or standard documents (like asking what things people typically include in a cover letter, a lesson plan, etc). You then take these prompts and write/create your own content, critically evaluating which of the suggestions you got might be useful and which aren't. Particularly for creative projects, this can help you to go beyond your initial idea. For example, if you were creating a public-facing resource educating someone about generative AI, you could ask a text generation AI tool for a list of ideas of different kinds of resources you might make, or you could use an image generation AI tool to suggest possible design features.

Always bear in mind that this should not replace you critically evaluating what should be included in anything you are creating! Don't just follow the instructions the computer gives you - they often aren't correct, or don't match up with your specific context. If generative AI isn't helping you with ideas, try other methods for exploring ideas instead.

Searching

To explore more about what AI tools can do in terms of finding information and being used as a reference source, see our Searching for Information guide page on AI tools.

Considering how we view and use AI

How artificial intelligence is developed, used, and viewed in society is a big area of discussion and debate. The world of artificial intelligence has been changing rapidly, with developments making more capable generative AI possible, so it can be useful to read around these topics and think more broadly about the societal impacts of AI as well as the technological ones. Lots of research is being done in these areas, including at the University of York.

AI ethics

There are many other ethical considerations in the world of AI. For example, some kinds of AI, like machine learning, can be based upon existing datasets, with the computer 'taught' from this existing data. What data is chosen as the training data is crucial: using datasets that contain inequality and bias will replicate those inequalities and biases in the AI tool. The data often contains copyrighted material or material that is someone's intellectual property, and its creators might not want people generating new work that is very similar to theirs (though copying work isn't something new to generative AI!).

When you use AI tools, it is good to be critical of artificial intelligence at the same time, and even of how we view the tools themselves. Do we see them as 'magic' applications that can create something out of nothing, or as complex code that has been designed and written by humans making choices? Does this make a difference to how we use the outputs of these tools? The 'people' side of generative AI is important: not just the people who create the tools, and the people whose work might have been used when the AI tool was trained, but also people whose work is more hidden, like those who label the huge datasets that generative AI is trained on. People run and work at the technology companies who build and sell these AI tools. Basically, there's a lot going on behind the scenes, and responsible use of AI means being aware of this!

There are also wider impacts of generative AI to consider, such as the climate impact of the huge amount of computing power needed to train generative AI tools. You can search for articles and research in this area, but bear in mind that, like all of generative AI, it is fast-moving, so new information and research is coming out all the time.

What is appropriate to generate?

Alongside the many ethical questions around the use of generative AI tools more broadly, it is also important to consider, each time you use a generative AI tool, whether generative AI is the right way to get what you're looking for. This might be because you need to be careful not to commit false authorship or do anything else that is against the University's guidelines on using AI tools (for example, the student guidance on using AI and translation tools) or any other guidance you've been given for the task. It might be because you could actually do the task more easily with a different digital tool or application that doesn't use generative AI. And it might be because what you want to create shouldn't be something that has been generated.

For example, if you're looking to represent a real-life issue, it might not be appropriate to generate an image or a written account of it, because this might misrepresent the issue (like adding something entirely unlikely or incorrect into the image or text) or appear fake or uncanny. You don't want to take a serious subject and end up with people with extra fingers or missing arms, for example, as that would distract the audience from the purpose of the image. Think critically about whether what you need should be accurate before using a generative AI tool to create it (and remember you can always choose not to use what you generate if you think it might not best represent what you need).

Acknowledging the use of AI

One area relating to generative AI tools is how we might acknowledge our use of these tools. We might consider questions like:

  • How can I reference the use of generative AI in academic work, if using it in line with guidelines such as University of York's student guidance on using AI and translation tools?
  • How should we acknowledge the use of generative AI tools outside of academic work?
  • Do we need to be more transparent with acknowledging the use of generative AI generally? What benefits might this bring?

To learn more about how you might reference generative AI tools in an academic context, see our referencing guide.

Why should we acknowledge the use of generative AI?

More generally, acknowledging when and how we have used generative AI can help to make it clear when these tools are and aren't used for different tasks. For example, if you credit any images you generate with an acknowledgement of the tool you used, it makes it clear that the image was created by AI rather than being a photograph of something real. Particularly for generated content that people might assume is real, this acknowledgement ensures you aren't unintentionally spreading misinformation or misleading people about the source of something like an image.

In an educational context, stating when and how you have used generative AI tools is important for ensuring you avoid academic misconduct, as well as being transparent about your learning or research process and which tools, if any, you've used as part of that process.

As there is currently a lot of debate around the ethical issues surrounding generative AI, being transparent about when you have used generative AI also allows people to make informed decisions around how they engage with generated content or AI tools. For example, people might want to know if something was made by a human, an AI tool, or a combination/collaboration between the two.

Text generation AI tools

One of the most talked about areas of AI generation has been text generation. This involves tools that can take text prompts and generate new text in response, often in the form of a chatbot, a way of appearing to talk to the computer as it gives conversation-style outputs. ChatGPT was an early big name in the area, but there are lots of others now, including Microsoft Copilot and Google Gemini.
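
Behind the chat interface, many of these tools can also be reached through a programming interface that takes the conversation as a list of role-tagged messages and returns the model's reply. As a minimal sketch only (not a recommendation of any particular provider), here is roughly what that looks like with OpenAI's Python library; the model name here is an assumption, and you would need your own account and API key:

    # A minimal sketch of calling a chat-style text generation model programmatically.
    # Assumes the 'openai' Python package is installed and an API key is available in
    # the OPENAI_API_KEY environment variable; the model name is just an example.
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; use whatever is available to you
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Suggest three ways a library could introduce students to generative AI."},
        ],
    )
    print(response.choices[0].message.content)  # the conversation-style reply

The 'conversation' the chatbot appears to hold is really just this list of messages growing with each new prompt and reply.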

As with other kinds of AI generation, there are some important considerations with these tools. They are sometimes referred to as large language models, or LLMs, as this is the technology they are built on. LLMs require huge datasets of text, and what this text is and how up to date it is can affect the responses generated. When using text generation tools, it can be worth exploring further what a tool has been trained on and whether it has different versions, to understand what you might be getting out of it.

There are also big questions around ethics and plagiarism, some of which are being explored in current academic research. The use of data, both for training these tools and in the prompts that users give them, can be a concern, with articles trying to help people work out how to remove their data from ChatGPT, for example. Be critical about what information you give to any generative AI tool in the form of prompts - for example, don't feed your research data into them if you don't want that data to be publicly available.

At the University of York, we have access to the free version of Microsoft Copilot, and if you log in with your University username and password, Microsoft will not retain any prompts or responses from Copilot and that data will not be used to train the underlying model. See the IT Services guidance on Copilot for more.

The tools often advertise that they will produce "original content", but verifying that would rely on people checking everything that the tool generates before using it. For lots of written content, the person who writes it is credited, and it matters who writes it. Obviously, these kinds of tools should never be used for academic work or anything else where you want the output to pass as your own work, as it won't be.

Text generation AI tools are interesting to consider in terms of what they say about work. Is writing copy, captions, video scripts, and similar material such a dispensable creative task that AI could do it more efficiently? How might this impact copywriters' jobs? These tools often state that they are making copywriters' jobs easier rather than replacing them, but there's a long history of how technology, work, and human skills interact. The use of generative AI does not happen in a vacuum, and we should consider the broader societal implications of any digital tools that we use.

A blue typewriter
AI text generation is similar to the old idea of monkeys typing Shakespeare - but this time they've been told about the works of Shakespeare and are trying to write a new version. Would that be Shakespeare's work or the monkeys' work?

Another way that these text generation tools are interesting is in how they can assist humans with their writing, rather than doing it for them. You can generate content to spark ideas, give you a starting point, or help find words that mean something specific by asking the tool for a word that matches your definition. Again, it is important to consider what you do with the outputs of these tools and whether generative AI is actually the most useful tool for your task (sometimes a simple online search for synonyms of a word, or for a template for a certain type of document, would be quicker and easier).

Image generation AI tools

A popular kind of AI generation tool is one that can create images from prompts, often applying particular criteria or using models with particular design styles. These have been used for fun, to make stock images without needing to pay or find copyright-free pictures, and to test the limits of AI. There are a huge number of image generation AI tools out there, and lots of other AI generation tools (and even image and design tools in general) now have image generation built in in some way. Lots of image generation tools cost money, and others give you a limited number of 'credits' to use to make images.

More broadly, image-related AI tools are used for image recognition, which has a varied range of applications in areas like healthcare, environment, farming, and much more. There's a long history of "computer vision"!

A set of images generated by craiyon for the prompt 'AI generation tools', which all feature an abstract brain or face of some kind
An example of AI generated images for the prompt "AI generation tools" created using craiyon (formerly DALL-E mini).

There are many tools out there that allow you to do similar things, from creating images using prompts, like the example above, to uploading an image and using AI to change it into a different style or make it more similar to another image. These can be great for inspiration or for fun, as the results are often ridiculous and not what you were actually looking for.

For tools that require prompts, what you get out often depends on using exactly the right terminology for that tool. There's a lot of trial and error involved, and you should be aware of biases in the data underlying these tools, which can impact their results (for example, here are some of the limitations and biases explored for DALL-E mini).

For tools that create images in certain art styles, watch out for where these tools could be drawing on the styles of real, living artists. Use them critically and think about where they might be imitating working artists in ways those artists might not want. For example, they might be good for inspiration, but not something you would use in a final output.

The uncanny valley

If you use AI generators to create images or videos of people, these may suffer from the concept of the uncanny valley: when humanoid objects look just imperfect enough to a human that they provoke uncanny feelings of uneasiness. Have you ever seen a computer-generated person (e.g. in a video game) and they don't quite look right? Or felt unnerved by dolls or humanoid puppets or robots? This can be the uncanny valley effect.

When using AI generation tools, bear this idea in mind if you're creating images of people. You might make your imagery worse if you try to include a human and they don't look real enough. Consider generating more abstract images instead (though AI generators can often make even inanimate objects look a bit "wrong" and unnerving), or finding copyright-free stock images from sites like Pixabay and Unsplash.

Writing good prompts for generative AI

When writing prompts for use with generative AI tools, you often need to think carefully about the words you use to get an output that matches what you're hoping for. Prompt engineering is one way of describing the process of writing an instruction in a way that can be interpreted by a generative AI model, meaning that a generative AI tool is more likely to give you the response you are looking for. Prompts are the text that you give a generative AI tool to get a response, so prompt engineering is trying to "engineer" that prompt to get a response that matches what you hoped for.

Tip

Prompt engineering can also refer to the work that AI developers and researchers do to improve tools that use AI models, but these strategies and guidance are often similar to the methods you can use when using the tools themselves.

There are a range of methods for trying to get better results from generative AI tools using prompt engineering. Typically, you would start with an initial prompt and evaluate the usefulness of the generative AI's response, then apply prompt engineering strategies to refine your prompt. You might try things like the following (there's a short illustrative sketch after this list):

  • Give instructions rather than just asking a question or writing a phrase, so there's more detail about what output you want. For example, "Tell me who is the author of the book War and Peace and give me some biographical details about them".
  • Be specific in your instructions about what kind of output you want. You might want to give examples of similar outputs or state the length or format needed. For example: "Write a three sentence summary of the book War and Peace" or "Write a story with a similar plot to the book War and Peace".
  • Include an audience the output is for, or a 'persona' that you want the output to be written from the perspective of. For example: "Write a summary of the book War and Peace aimed at a ten-year-old" or "Write a review of the book War and Peace as if you hated it".
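
As a rough illustration of how these strategies can be combined, here is a small, hypothetical Python sketch that assembles a prompt from a task, an output format, an audience, and a persona. The function and its names are made up for this example; the text it produces is simply what you would type or paste into a generative AI tool:

    # A hypothetical sketch of combining the strategies above into a single prompt.
    # The function and its parameter names are invented purely for illustration.
    def build_prompt(task, output_format=None, audience=None, persona=None):
        """Combine an instruction with optional format, audience, and persona details."""
        parts = [task]
        if output_format:
            parts.append(f"Give the answer as {output_format}.")
        if audience:
            parts.append(f"Write it for {audience}.")
        if persona:
            parts.append(f"Respond as if you are {persona}.")
        return " ".join(parts)

    # Start simple, then refine by adding detail:
    print(build_prompt("Summarise the plot of the book War and Peace."))
    print(build_prompt(
        "Summarise the plot of the book War and Peace.",
        output_format="three sentences",
        audience="a ten-year-old",
        persona="a reviewer who hated the book",
    ))

Adding these extra details is exactly what the strategies above describe: the more specific the instruction, the more the tool has to work with when generating a response.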

Writing prompts for each generative AI model and the tools based on it will differ, because these models and tools have different datasets and are designed in different ways. When using a generative AI tool you will start to learn what kinds of prompts work best with it, but when you use a different tool, this might change. For example, if you've used ChatGPT before, you might need to phrase your prompts for Copilot or Gemini differently to get the most effective outputs.

Prompt writing is a similar skill to writing good search terms when searching online, as both skills require you to put together text or instructions based on what you're looking for and to refine what you're asking based on the results you get. However, you will likely need to use different styles of prompts than you might be used to when searching online!
